Web Survey Bibliography
This study investigates how an onscreen virtual agent's dialog capability and facial animation affect survey respondents' comprehension and engagement in “face-to-face” interviews, using questions from US government surveys whose results have far-reaching impact on national policies. In the study, 73 laboratory participants were randomly assigned to one of four interviewing conditions, in which the virtual agent had either high or low dialog capability (implemented through a Wizard-of-Oz setup) and either high or low facial animation (based on motion capture from a human interviewer). Respondents, whose faces were visible to the Wizard (and video-recorded) during the interviews, answered 12 questions about housing, employment, and purchases on the basis of fictional scenarios designed to allow measurement of comprehension accuracy, defined as the fit between responses and US government definitions. Respondents answered more accurately with the high-dialog-capability agents, requesting clarification more often, particularly for ambiguous scenarios; they also treated the high-dialog-capability interviewers more socially, looking at the interviewer more and judging those agents as more personal and less distant. Greater interviewer facial animation did not affect response accuracy, but it led to more displays of engagement (verbal and visual acknowledgments, and smiles) and to the virtual interviewer being rated as less natural. The pattern of results suggests that a virtual agent's dialog capability and facial animation differently affect survey respondents' experience of interviews, behavioral displays, and comprehension, and thus the accuracy of their responses. The results also suggest design considerations for building survey interviewing agents, which may differ depending on the kinds of survey questions (sensitive or not) that are asked.
Web survey bibliography - other entries by Schober, M. F. (14)
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G., Schober, M. F., Antoun, C., Yan, H. Y., Hupp, A., Johnston, M., Ehlen, P., Vickers, L...
- Comparisons of Online Recruitment Strategies for Convenience Samples: Craigslist, Google AdWords, Facebook...; 2016; Antoun, C., Zhang, C., Conrad, F. G., Schober, M. F.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G., Schober, M. F., Jans, M., Orlowski, R. A., Nielsen, D., Levenstein, R. M.
- Social Media Analyses for Social Measurement; 2016; Schober, M. F., Pasek, J., Guggenheim, L., Lampe, C., Conrad, F. G.
- Mobile Technologies for Conducting, Augmenting and Potentially Replacing Surveys: Report of the AAPOR...; 2014; Link, M. W., Murphy, J., Schober, M. F., Buskirk, T. D., Childs, J. H., Tesfaye, C.
- Effects of Self-Awareness on Disclosure During Skype Survey Interviews; 2013; Feuer, S., Schober, M. F.
- Disfluencies and Gaze Aversion in Unreliable Responses to Survey Questions; 2012; Schober, M. F., Conrad, F. G., Dijkstra, W., Ongena, Y. P.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Envisioning the Survey Interview of the Future; 2009; Conrad, F. G., Schober, M. F.
- Social Cues Can Affect Answers to Threatening Questions in Virtual Interviews; 2008; Lind, L. H., Schober, M. F., Conrad, F. G.
- Virtual Interviews on Mundane, Non-Sensitive Topics: Dialog Capability Affects Response Accuracy More...; 2008; Conrad, F. G., Schober, M. F., Jans, M., Orlowski, R. A., Nielsen, D.
- Surveys interviews and new communication technologies; 2007; Schober, M. F., Conrad, F. G.
- Promoting Uniform Question Understanding in Today's and Tomorrow's Surveys; 2005; Conrad, F. G., Schober, M. F.